Reviews: Metalearned Neural Memory
UPDATED I thank the authors for their rebuttal comments. All my concerns have been addressed (modulo seeing the extra results / error bars), so I am raising my score to 8. The idea of parameterising the memory as a neural network, and using ideas from metalearning to quickly train it to produce a specified output for new sequences, is very interesting and novel. The paper is overall well written, and I believe it should be reproducible by those familiar with metalearning approaches. The justification for the model is interesting: essentially, instead of writing values to a fixed-size memory, with reads then limited to a convex combination of the written values, using a neural network offers potential benefits in compression and generalisation, with constant space. Obviously the key issue is whether the memory function can be easily modified in one shot so that a new set of keys and values will be 'read' approximately correctly.
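For context, the conventional slot-based read the reviewer contrasts against can be sketched as soft attention over a fixed memory matrix; the output is necessarily a convex combination of stored slot vectors. The function and variable names below are illustrative, not from the paper.

```python
import numpy as np

def slot_read(M, key):
    # Soft attention: softmax similarity between the key and each slot.
    scores = M @ key
    w = np.exp(scores - scores.max())
    w /= w.sum()
    # The result is a convex combination of the stored slot vectors,
    # so it can never leave their convex hull.
    return w @ M

M = np.eye(8)                            # 8 slots holding the basis vectors
out = slot_read(M, 5.0 * np.eye(8)[0])   # key most similar to slot 0
```

With a key aligned to slot 0, the read concentrates most of its weight on that slot's content, which is exactly the behaviour the neural memory generalises beyond.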
Metalearned Neural Memory
We augment recurrent neural networks with an external memory mechanism that builds upon recent progress in metalearning. Reading from the neural memory function amounts to pushing an input (the key vector) through the function to produce an output (the value vector). Writing to memory means changing the function; specifically, updating the parameters of the neural network to encode desired information. We leverage training and algorithmic techniques from metalearning to update the neural memory function in one shot. The proposed memory-augmented model achieves strong performance on a variety of learning problems, from supervised question answering to reinforcement learning.
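The read/write semantics described above can be sketched in a few lines. Note this is a simplified illustration, not the paper's method: the paper metalearns the write-time update so that it works in one shot, whereas the sketch below approximates writing with a few plain gradient steps on a small MLP; all names and hyperparameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class NeuralMemory:
    """Memory as a small MLP: read = forward pass, write = parameter update."""

    def __init__(self, dim=8, hidden=32, lr=0.1):
        self.W1 = rng.normal(0.0, 0.1, (hidden, dim))
        self.W2 = rng.normal(0.0, 0.1, (dim, hidden))
        self.lr = lr

    def read(self, key):
        # Reading: push the key vector through the function to get a value.
        return self.W2 @ np.tanh(self.W1 @ key)

    def write(self, key, value, steps=100):
        # Writing: change the function so that read(key) ~= value.
        # (The paper metalearns this update to succeed in one shot; here we
        # substitute ordinary gradient descent on 0.5 * ||read(key) - value||^2.)
        for _ in range(steps):
            h = np.tanh(self.W1 @ key)
            err = self.W2 @ h - value                      # dLoss/dOutput
            gW2 = np.outer(err, h)
            gW1 = np.outer((self.W2.T @ err) * (1.0 - h**2), key)
            self.W2 -= self.lr * gW2
            self.W1 -= self.lr * gW1

mem = NeuralMemory()
k, v = rng.normal(size=8), rng.normal(size=8)
mem.write(k, v)
recall_error = np.linalg.norm(mem.read(k) - v)  # small after writing
```

The appeal of this formulation is that storage cost is constant in the number of items written, and nearby keys can retrieve interpolated values, at the cost of possible interference between writes.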
Munkhdalai, Tsendsuren; Sordoni, Alessandro; Wang, Tong; Trischler, Adam